
feat: add support for application templates #660

Open
razvan wants to merge 12 commits into main from feat/app-templates

Conversation

@razvan
Member

@razvan razvan commented Feb 25, 2026

@razvan razvan self-assigned this Feb 25, 2026
@razvan razvan moved this to Development: In Progress in Stackable Engineering Feb 25, 2026
@razvan razvan marked this pull request as ready for review February 27, 2026 09:08
@razvan razvan moved this from Development: In Progress to Development: Waiting for Review in Stackable Engineering Feb 27, 2026
@razvan razvan requested a review from a team February 27, 2026 09:10
@adwk67 adwk67 requested review from adwk67 and removed request for a team February 27, 2026 14:51
@adwk67 adwk67 moved this from Development: Waiting for Review to Development: In Review in Stackable Engineering Feb 27, 2026
Member

@adwk67 adwk67 left a comment


Not finished yet, but here are some initial comments.

----
<1> The kind of the resource is `SparkApplicationTemplate` to indicate that this is an application template.
<2> Name of the application template.
<3> The value of `mainApplicationFile` is set to a placeholder value, which is overridden by the application resource. As in the application, the fields `sparkImage`, `mode`, `mainClass`, and `mainApplicationFile` are required for the template to be valid.
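For orientation, the callouts above might correspond to a template manifest along these lines. This is a sketch only: the `apiVersion` group and every field beyond those named in the callouts (`kind`, `metadata.name`, `sparkImage`, `mode`, `mainClass`, `mainApplicationFile`) are assumptions, not taken from the actual CRD.

```yaml
# Hypothetical SparkApplicationTemplate sketch. Field names other than
# those quoted in the callouts above are assumptions.
apiVersion: spark.stackable.tech/v1alpha1
kind: SparkApplicationTemplate        # <1> the kind marks this as a template
metadata:
  name: app-template-0                # <2> name of the application template
  # no namespace: templates are cluster-scoped per the docs below
spec:
  sparkImage:
    productVersion: 3.5.5             # required, as in a SparkApplication
  mode: cluster                       # required
  mainClass: org.example.Main         # required
  mainApplicationFile: "placeholder"  # <3> overridden by the application
```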
Member


Why do we have to define `sparkImage` etc. both in the template and the app? Is the plan with v1alpha2 to remove them as mandatory in the application so we can have them only in the template (or override them)? If so, can they be optional in the template?


Spark application templates are used to define reusable configurations for Spark applications.
When you have many applications with similar configurations, templates can help you avoid duplication by grouping common settings together.
Application templates are available for the `v1alpha1` version of the SparkApplication custom resource and share the exact same structure as the SparkApplication resource, but with some differences in the way the operator handles them:
Member


Do we know yet if the plan is to keep versions in step with one another (e.g. a v1alpha2 is created for one entity when the other entity is bumped to v1alpha2)? I think that might make mapping between app and template a little easier.


1. Application templates are cluster wide resources, while Spark application resources are namespace-scoped. This means that application templates can be used across multiple namespaces, while Spark application resources are limited to the namespace they are created in.
Member


Sentences should be on new lines in the docs e.g.

Suggested change
1. Application templates are cluster wide resources, while Spark application resources are namespace-scoped. This means that application templates can be used across multiple namespaces, while Spark application resources are limited to the namespace they are created in.
1. Application templates are cluster wide resources, while Spark application resources are namespace-scoped.
This means that application templates can be used across multiple namespaces, while Spark application resources are limited to the namespace they are created in.

(not sure if we want indenting there...)

mainApplicationFile: "/examples.jar"
----
<1> Enable application template merging for this application.
<2> The name of the application templates to reference. The settings from these templates will be merged together in the order they are referenced, with `app-template-0` having the lowest precedence and `app-template-2` having the highest precedence. Tha application fields have the highest overall precedence and will override any conflicting settings from the templates.
Member


Suggested change
<2> The name of the application templates to reference. The settings from these templates will be merged together in the order they are referenced, with `app-template-0` having the lowest precedence and `app-template-2` having the highest precedence. Tha application fields have the highest overall precedence and will override any conflicting settings from the templates.
<2> The name of the application templates to reference. The settings from these templates will be merged together in the order they are referenced, with `app-template-0` having the lowest precedence and `app-template-2` having the highest precedence. The application fields have the highest overall precedence and will override any conflicting settings from the templates.
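The precedence rules described in that callout can be sketched as an ordered deep merge: templates are applied in reference order (later wins), and the application's own fields are applied last. This is an illustration of the stated semantics only, not the operator's actual implementation; the dictionaries stand in for template and application specs.

```python
# Sketch of the described merge semantics, assuming a simple recursive
# dictionary merge. Not the operator's actual merge code.
def deep_merge(base: dict, overlay: dict) -> dict:
    """Return base with overlay applied on top; overlay wins on conflicts."""
    out = dict(base)
    for key, value in overlay.items():
        if isinstance(value, dict) and isinstance(out.get(key), dict):
            out[key] = deep_merge(out[key], value)
        else:
            out[key] = value
    return out

templates = [
    {"sparkImage": "3.4.0", "mode": "cluster"},  # app-template-0 (lowest precedence)
    {"sparkImage": "3.5.1"},                     # app-template-2 (highest template)
]
app = {"mainApplicationFile": "/examples.jar"}   # application fields win overall

merged = {}
for tpl in templates:
    merged = deep_merge(merged, tpl)
merged = deep_merge(merged, app)
# merged keeps "mode" from app-template-0, "sparkImage" from app-template-2,
# and the application's own "mainApplicationFile"
```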

metadata:
  name: app
  annotations:
    spark-application.template.merge: "true" # <1>
Member


I think it would be good to include all available annotations in the example to be comprehensive.


Labels

None yet

Projects

Status: Development: In Review

Development

Successfully merging this pull request may close these issues.

Spark: shared Spark configuration across applications

3 participants